western United States

Definitions of western United States
  1. noun
    the region of the United States lying to the west of the Mississippi River
    synonyms: West
    examples:
    Wild West: the western United States during its frontier period